The optimized gradient method (OGM) was recently developed by optimizing the step coefficients of first-order methods with respect to the function value. OGM has a per-iteration computational cost similar to that of Nesterov's fast gradient method (FGM), yet satisfies a worst-case convergence bound on the function value that is half that of FGM. Moreover, OGM was recently shown to achieve the optimal cost-function complexity bound of first-order methods (with either fixed or dynamic step sizes) for smooth convex minimization. Because OGM outperforms the widely used FGM in terms of the worst-case rate of decrease of the function value for smooth convex minimization, it is desirable to further understand the formulation and convergence analysis of OGM. Therefore, this paper studies a generalized formulation of OGM and its convergence analysis in terms of both the function value and the gradient. We then optimize the step coefficients of first-order methods in terms of the rate of decrease of the norm of the gradient. This analysis leads to a new algorithm, called OGM-OG, that has the best known analytical worst-case bound on the decrease of the gradient norm among fixed-step first-order methods.
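For concreteness, the sketch below illustrates the standard fixed-step OGM recursion as published by Kim and Fessler, which the abstract refers to; the generalized formulation and OGM-OG studied in the paper are not reproduced here. The quadratic test function, the Lipschitz constant L, the iteration count, and the helper names (ogm, grad) are illustrative assumptions, not details taken from this abstract.

```python
import numpy as np

def ogm(grad, L, x0, N):
    """Sketch of the standard OGM iteration with fixed step size 1/L,
    assuming f is convex with L-Lipschitz gradient."""
    x = y = x0.astype(float)
    theta = 1.0
    for k in range(N):
        y_next = x - grad(x) / L                      # plain gradient step
        # momentum parameter; the final iteration uses the modified update
        if k < N - 1:
            theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:
            theta_next = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        # OGM adds an extra correction term to the usual Nesterov momentum
        x = (y_next
             + ((theta - 1) / theta_next) * (y_next - y)
             + (theta / theta_next) * (y_next - x))
        y, theta = y_next, theta_next
    return x

# illustrative smooth convex quadratic f(x) = 0.5 * x^T A x (assumed example)
A = np.diag([1.0, 10.0, 100.0])           # L = 100 (largest eigenvalue)
grad = lambda x: A @ x
x_N = ogm(grad, L=100.0, x0=np.ones(3), N=50)
print(np.linalg.norm(grad(x_N)))          # gradient norm after N iterations
```

The per-iteration cost is one gradient evaluation plus a few vector operations, matching FGM; the difference lies only in the momentum coefficients, which is what the step-coefficient optimization described in the abstract adjusts.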